MOTIF-Driven Contrastive Learning of Graph Representations

Authors

Abstract

We propose a MOTIF-driven contrastive framework to pretrain graph neural networks in a self-supervised manner so that they can automatically mine motifs from large datasets. Our method achieves state-of-the-art results on various graph-level downstream tasks with few labels, such as molecular property prediction.
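As a rough illustration of the contrastive pretraining idea the abstract refers to, below is a minimal NT-Xent/InfoNCE-style sketch in PyTorch. The motif-like subgraph view, the `encoder`, and all hyperparameters are assumptions for illustration only, not the paper's actual objective.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(z1, z2, temperature=0.5):
    """Generic NT-Xent / InfoNCE loss between two batches of graph embeddings.

    z1, z2: (batch, dim) embeddings of two views of the same graphs,
    e.g. a whole graph and a sampled motif-like subgraph (an assumption here).
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature                    # pairwise cosine similarities
    labels = torch.arange(z1.size(0), device=z1.device)   # positives sit on the diagonal
    return F.cross_entropy(logits, labels)

# Hypothetical usage with some GNN encoder:
# loss = info_nce_loss(encoder(graph_batch), encoder(subgraph_batch))
```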


Similar Articles

Contrastive Learning of Emoji-based Representations for Resource-Poor Languages

The introduction of emojis (or emoticons) in social media platforms has given users an increased potential for expression. We propose a novel method called Classification of Emojis using Siamese Network Architecture (CESNA) to learn emoji-based representations of resource-poor languages by jointly training them with resource-rich languages using a siamese network. The CESNA model consists of tw...
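As a loose illustration of the siamese setup this snippet mentions (one shared encoder applied to both members of a pair), here is a hypothetical PyTorch sketch; `SharedTextEncoder`, the bag-of-tokens encoder, and the margin-based pair loss are assumptions for illustration, not the CESNA architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedTextEncoder(nn.Module):
    """Hypothetical shared (siamese) encoder: averaged token embeddings + projection."""
    def __init__(self, vocab_size=10000, dim=128):
        super().__init__()
        self.embed = nn.EmbeddingBag(vocab_size, dim)  # mean-pools token embeddings per text
        self.proj = nn.Linear(dim, dim)

    def forward(self, token_ids):                      # token_ids: (batch, seq_len) LongTensor
        return self.proj(self.embed(token_ids))

def contrastive_pair_loss(h1, h2, same_emoji, margin=1.0):
    """Classic siamese contrastive loss: pull pairs sharing an emoji label together,
    push pairs with different emojis at least `margin` apart."""
    d = F.pairwise_distance(h1, h2)
    return (same_emoji * d.pow(2) + (1 - same_emoji) * F.relu(margin - d).pow(2)).mean()
```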


Learning Aspect Graph Representations from View Sequences

In our effort to develop a modular neural system for invariant learning and recognition of 3D objects, we introduce here a new module architecture called an aspect network constructed around adaptive axo-axo-dendritic synapses. This builds upon our existing system (Seibert & Waxman, 1989), which processes 2D shapes and classifies them into view categories (i.e., aspects) invariant to illuminat...


A Boosting Approach to Learning Graph Representations

Learning the right graph representation from noisy, multisource data has garnered significant interest in recent years. A central tenet of this problem is relational learning. Here the objective is to incorporate the partial information each data source gives us in a way that captures the true underlying relationships. To address this challenge, we present a general, boosting-inspired framework...


Learning Graph Representations with Embedding Propagation

Label Representations
• Let l ∈ R^d be the representation of label l, and f be a differentiable embedding function.
• For labels of label type i, we apply a learnable embedding function l = f_i(l).
• h_i(v) is the embedding of label type i for vertex v: h_i(v) = g_i({l | l ∈ labels of type i associated with vertex v}).
• h̃_i(v) is the reconstruction of the embedding of label type i for vertex v: h̃_i(v) ...
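To make the notation above concrete, here is a tiny PyTorch sketch of h_i(v) that uses averaging as the aggregator g_i; averaging is an assumption for illustration, since the excerpt leaves g_i generic.

```python
import torch

def label_type_embedding(label_ids, embedding_table):
    # label_ids: indices of all type-i labels attached to vertex v
    # embedding_table: (num_labels, d) learned label embeddings (the output of f_i)
    vecs = embedding_table[label_ids]   # gather the label representations
    return vecs.mean(dim=0)             # g_i modeled as a simple average (assumption)
```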


Learning Deep Representations for Graph Clustering

Recently, deep learning has been successfully adopted in many applications such as speech recognition and image classification. In this work, we explore the possibility of employing deep learning in graph clustering. We propose a simple method, which first learns a nonlinear embedding of the original graph using a stacked autoencoder, and then runs the k-means algorithm on the embedding to obtain cluster...
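As a rough sketch of the two-stage pipeline described here (nonlinear embedding, then k-means), assuming PyTorch and scikit-learn: a single hidden layer stands in for the stacked autoencoder, the input is taken to be a graph similarity matrix S, and all names and hyperparameters are illustrative.

```python
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

def embed_and_cluster(S, dim=32, k=5, epochs=200, lr=1e-2):
    """Hypothetical two-stage pipeline: (1) learn a nonlinear embedding of the
    graph similarity matrix S with a small autoencoder, (2) run k-means on the codes."""
    S = torch.as_tensor(S, dtype=torch.float32)          # (n, n) similarity / adjacency matrix
    n = S.size(0)
    enc = nn.Sequential(nn.Linear(n, dim), nn.Sigmoid())
    dec = nn.Sequential(nn.Linear(dim, n), nn.Sigmoid())
    opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(dec(enc(S)), S)    # reconstruct each row of S
        loss.backward()
        opt.step()
    codes = enc(S).detach().numpy()                      # learned embeddings
    return KMeans(n_clusters=k, n_init=10).fit_predict(codes)
```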



Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2021

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v35i18.17986